#curse of multilinguality · 08/08/2025
Meta CLIP 2: Breaking the Curse of Multilinguality with Worldwide Image‑Text Pretraining
Meta CLIP 2 trains CLIP models from scratch on native worldwide image-text pairs, showing that scaling metadata, curation, and model capacity reduces the curse of multilinguality while improving zero-shot and multilingual benchmark results.
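For context, zero-shot classification with a CLIP-style model just compares an image embedding against embeddings of candidate captions and picks the most similar one. The sketch below illustrates this with the open_clip library; the architecture and checkpoint names are placeholders rather than the paper's released identifiers, and how well non-English captions tokenize depends on the multilingual tokenizer actually shipped with the weights.

```python
# Minimal sketch of multilingual zero-shot classification with a CLIP-style
# model via open_clip. Model/checkpoint names are hypothetical placeholders;
# substitute whatever Meta CLIP 2 weights are actually released.
import torch
import open_clip
from PIL import Image

# Placeholder identifiers, not confirmed by the paper or open_clip releases.
model, _, preprocess = open_clip.create_model_and_transforms(
    "ViT-H-14", pretrained="metaclip2_worldwide"
)
tokenizer = open_clip.get_tokenizer("ViT-H-14")
model.eval()

# Candidate captions in several languages; zero-shot classification picks the
# caption whose text embedding is most similar to the image embedding.
captions = ["a photo of a dog", "una foto de un perro", "一张狗的照片"]
image = preprocess(Image.open("dog.jpg")).unsqueeze(0)
text = tokenizer(captions)

with torch.no_grad():
    img_feat = model.encode_image(image)
    txt_feat = model.encode_text(text)
    img_feat /= img_feat.norm(dim=-1, keepdim=True)
    txt_feat /= txt_feat.norm(dim=-1, keepdim=True)
    # Softmax over scaled cosine similarities gives per-caption probabilities.
    probs = (100.0 * img_feat @ txt_feat.T).softmax(dim=-1)

print(probs)
```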